Variational Tobit Gaussian Process Regression

Authors

Abstract

We propose a variational inference-based framework for training a Gaussian process regression model subject to censored observational data. Data censoring is a typical problem encountered during the data gathering procedure and requires specialized techniques to perform inference, since the resulting probabilistic models are typically analytically intractable. In this article we exploit sparse inducing variable and local variational methods to compute a tractable lower bound on the true log marginal likelihood, which can be used for Bayesian inference. We demonstrate the proposed approach on synthetically produced, noise-corrupted data, as well as on a real-world data set with artificial censoring. The predictions are comparable to those of existing approaches that account for censoring, but the proposed framework provides a significant reduction in computational cost.
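
As a rough, hedged illustration of the modelling ingredient the abstract describes, the sketch below evaluates a Tobit (censored Gaussian) log-likelihood: uncensored observations contribute a Gaussian density term around the latent GP value, while observations clipped at a censoring threshold contribute the corresponding Gaussian tail probability. All names (`tobit_log_likelihood`, `f`, `lower`, `upper`) are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def tobit_log_likelihood(y, f, noise_std, lower=None, upper=None):
    """Tobit (censored Gaussian) log-likelihood of observations y
    given latent function values f. Illustrative sketch only."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    cens_low = np.zeros(y.shape, bool) if lower is None else (y <= lower)
    cens_high = np.zeros(y.shape, bool) if upper is None else (y >= upper)
    exact = ~(cens_low | cens_high)

    ll = np.zeros_like(y)
    # Exact observations: Gaussian density centred on the latent value.
    ll[exact] = norm.logpdf(y[exact], loc=f[exact], scale=noise_std)
    # Left-censored observations: probability mass below the threshold.
    if lower is not None:
        ll[cens_low] = norm.logcdf(lower, loc=f[cens_low], scale=noise_std)
    # Right-censored observations: probability mass above the threshold.
    if upper is not None:
        ll[cens_high] = norm.logsf(upper, loc=f[cens_high], scale=noise_std)
    return ll.sum()
```

In a sparse variational scheme of the kind the abstract alludes to, per-observation terms like these would be averaged under a Gaussian approximate posterior over f (e.g. by quadrature or Monte Carlo) and combined with a KL penalty on the inducing variables to form the tractable lower bound on the log marginal likelihood; the exact construction is given in the full article.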


Similar articles

Variational Heteroscedastic Gaussian Process Regression

Standard Gaussian processes (GPs) model observations’ noise as constant throughout input space. This is often a too restrictive assumption, but one that is needed for GP inference to be tractable. In this work we present a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions). Computational cost is roughly twic...
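
For orientation, the input-dependent noise this entry refers to can be written as y = f(x) + ε(x) with ε(x) ~ N(0, exp(g(x))), where a second latent function g sets the log noise variance; the minimal sketch below (illustrative names, not the cited paper's code) simply evaluates that likelihood for given latent values.

```python
import numpy as np
from scipy.stats import norm

def heteroscedastic_log_likelihood(y, f, g):
    """Gaussian log-likelihood with input-dependent noise: f gives the
    latent mean and g the latent log noise variance at each input.
    Illustrative sketch, not the cited paper's implementation."""
    scale = np.exp(0.5 * np.asarray(g, float))  # std dev = exp(g / 2)
    return norm.logpdf(np.asarray(y, float),
                       loc=np.asarray(f, float), scale=scale).sum()
```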


Asynchronous Distributed Variational Gaussian Process for Regression

Gaussian processes (GPs) are powerful nonparametric function estimators. However, their applications are largely limited by the expensive computational cost of the inference procedures. Existing stochastic or distributed synchronous variational inferences, although they have alleviated this issue by scaling up GPs to millions of samples, are still far from satisfactory for real-world large applicati...


Incremental Variational Sparse Gaussian Process Regression

Recent work on scaling up Gaussian process regression (GPR) to large datasets has primarily focused on sparse GPR, which leverages a small set of basis functions to approximate the full Gaussian process during inference. However, the majority of these approaches are batch methods that operate on the entire training dataset at once, precluding the use of datasets that are streaming or too large ...
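
As background for this entry, the "small set of basis functions" in sparse GPR typically means a set of inducing inputs Z that summarise the full training set; the sketch below shows the resulting predictive mean in its standard DTC/variational form (assumed squared-exponential kernel; all names illustrative), independent of the incremental/streaming scheme the cited paper develops.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of a and b."""
    d2 = (a**2).sum(1)[:, None] + (b**2).sum(1)[None, :] - 2.0 * a @ b.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_predictive_mean(X, y, Z, Xstar, noise_var=0.1, jitter=1e-6):
    """Predictive mean of a sparse GP summarised by inducing inputs Z
    (DTC / variational form). Illustrative sketch only."""
    Kuu = rbf(Z, Z) + jitter * np.eye(len(Z))
    Kuf = rbf(Z, X)                        # M x N cross-covariance
    Sigma = Kuu + Kuf @ Kuf.T / noise_var  # M x M system instead of N x N
    return rbf(Xstar, Z) @ np.linalg.solve(Sigma, Kuf @ y) / noise_var
```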


Efficient Variational Inference for Gaussian Process Regression Networks

In multi-output regression applications the correlations between the response variables may vary with the input space and can be highly non-linear. Gaussian process regression networks (GPRNs) are flexible and effective models to represent such complex adaptive output dependencies. However, inference in GPRNs is intractable. In this paper we propose two efficient variational inference methods f...


Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression

We introduce a novel variational method that allows us to approximately integrate out kernel hyperparameters, such as length-scales, in Gaussian process regression. This approach consists of a novel variant of the variational framework that has recently been developed for the Gaussian process latent variable model, which additionally makes use of a standardised representation of the Gaussian proces...



Journal

Journal title: Statistics and Computing

Year: 2023

ISSN: 0960-3174, 1573-1375

DOI: https://doi.org/10.1007/s11222-023-10225-3